On Decoding Schemes for the MDPC-McEliece Cryptosystem
Recently, it has been shown how McEliece public-key cryptosystems based on
moderate-density parity-check (MDPC) codes allow for very compact keys compared
to variants based on other code families. In this paper, classical (iterative)
decoding schemes for MDPC codes are considered. The algorithms are analyzed
with respect to their error-correction capability as well as their resilience
against a recently proposed reaction-based key-recovery attack on a variant of
the MDPC-McEliece cryptosystem by Guo, Johansson and Stankovski (GJS). New
message-passing decoding algorithms are presented and analyzed. Two proposed
decoding algorithms have an improved error-correction performance compared to
existing hard-decision decoding schemes and are resilient against the GJS
reaction-based attack for an appropriate choice of the algorithm's parameters.
Finally, a modified belief propagation decoding algorithm that is resilient
against the GJS reaction-based attack is presented.
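The hard-decision schemes considered here are variants of Gallager's bit-flipping algorithm. As a minimal sketch of that classical baseline (not of the new message-passing algorithms the paper proposes), each iteration flips every bit involved in the largest number of unsatisfied checks; the threshold rule is one common choice among several:

```python
import numpy as np

def bit_flip_decode(H, y, max_iters=50, threshold=None):
    """Gallager-style hard-decision bit-flipping decoding, the classical
    iterative baseline for codes with sparse parity-check matrices.

    H: binary parity-check matrix (m x n numpy array), y: received hard
    decisions. Each iteration flips every bit that participates in at
    least `threshold` unsatisfied checks (default: the current maximum)."""
    x = y.copy()
    for _ in range(max_iters):
        syndrome = H @ x % 2
        if not syndrome.any():
            return x, True                 # all checks satisfied
        upc = syndrome @ H                 # unsatisfied-check count per bit
        t = threshold if threshold is not None else upc.max()
        flip = upc >= t
        if not flip.any():
            break
        x = (x + flip) % 2
    return x, False
```

For MDPC codes the threshold choice matters both for error-correction performance and, as the abstract notes, for resilience against reaction-based attacks.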
Protograph-Based LDPC Code Design for Shaped Bit-Metric Decoding
A protograph-based low-density parity-check (LDPC) code design technique for
bandwidth-efficient coded modulation is presented. The approach jointly
optimizes the LDPC code node degrees and the mapping of the coded bits to the
bit-interleaved coded modulation (BICM) bit-channels. For BICM with uniform
input and for BICM with probabilistic shaping, binary-input symmetric-output
surrogate channels for the code design are used. The constructed codes for
uniform inputs perform as well as the multi-edge type codes of Zhang and
Kschischang (2013). For 8-ASK and 64-ASK with probabilistic shaping, codes of
rates 2/3 and 5/6 with blocklength 64800 are designed, which operate within
0.63 dB and 0.69 dB of continuous AWGN capacity for a target frame error rate of
1e-3 at spectral efficiencies of 1.38 and 4.25 bits/channel use, respectively.
Comment: 9 pages, 10 figures. arXiv admin note: substantial text overlap with arXiv:1501.0559
High-Throughput Random Access via Codes on Graphs
Recently, contention resolution diversity slotted ALOHA (CRDSA) has been
introduced as a simple but effective improvement to slotted ALOHA. It relies on
MAC burst repetitions and on interference cancellation to increase the
normalized throughput of a classic slotted ALOHA access scheme. CRDSA achieves
a larger throughput than slotted ALOHA, at the price of an increased
average transmitted power. A way to trade off the increase in average
transmitted power against the improvement in throughput is presented in this
paper. Specifically, it is proposed to divide each MAC burst into k sub-bursts
and to encode them via an (n,k) erasure correcting code. The n encoded
sub-bursts are transmitted over the MAC channel, according to specific
time/frequency-hopping patterns. Whenever n-e>=k sub-bursts (of the same burst)
are received without collisions, erasure decoding allows recovering the
remaining e sub-bursts (which were lost due to collisions). An interference
cancellation process can then take place, removing in e slots the interference
caused by the e recovered sub-bursts, possibly allowing the correct decoding of
sub-bursts related to other bursts. The process is thus iterated as for the
CRDSA case.
Comment: Presented at the Future Network and MobileSummit 2010 Conference, Florence (Italy), June 2010.
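The recovery step described in the abstract can be illustrated with the simplest (n, k) = (k+1, k) choice: a single parity-check erasure code whose parity sub-burst is the XOR of the k data sub-bursts, so any one collided sub-burst is repaired. A minimal sketch (sub-bursts modeled as integers, function names illustrative):

```python
from functools import reduce

def encode_spc(sub_bursts):
    """(k+1, k) single parity-check erasure code: append the XOR of the
    k data sub-bursts (each modeled here as an integer)."""
    return sub_bursts + [reduce(lambda a, b: a ^ b, sub_bursts)]

def recover_spc(received):
    """Recover the k data sub-bursts from a received list in which at
    most one entry is None (a sub-burst erased by a collision)."""
    erased = [i for i, r in enumerate(received) if r is None]
    if len(erased) > 1:
        return None                        # beyond the code's erasure capability
    out = list(received)
    if erased:
        # the missing sub-burst is the XOR of everything that survived
        out[erased[0]] = reduce(lambda a, b: a ^ b,
                                (r for r in received if r is not None))
    return out[:-1]                        # drop the parity sub-burst
```

A general (n,k) code recovers up to n-k erased sub-bursts in the same spirit; the recovered sub-bursts are then fed back to the interference cancellation stage.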
Caching at the Edge with Fountain Codes
We address the use of linear random fountain codes in caching schemes in a
heterogeneous satellite network. We consider a system composed of multiple hubs
and a geostationary Earth orbit satellite. Coded content is memorized in hubs'
caches in order to serve user requests immediately and to reduce the usage of
the satellite backhaul link. We derive the analytical expression of the average
backhaul rate, as well as a tight upper bound to it with a simple expression.
Furthermore, we derive the optimal caching strategy which minimizes the average
backhaul rate and compare the performance of the linear random fountain code
scheme to that of a scheme using maximum distance separable codes. Our
simulation results indicate that the performance obtained using fountain codes
is similar to that of maximum distance separable codes.
Coded Slotted ALOHA: A Graph-Based Method for Uncoordinated Multiple Access
In this paper, a random access scheme is introduced which relies on the
combination of packet erasure correcting codes and successive interference
cancellation (SIC). The scheme is named coded slotted ALOHA. A bipartite graph
representation of the SIC process, resembling iterative decoding of generalized
low-density parity-check codes over the erasure channel, is exploited to
optimize the selection probabilities of the component erasure correcting codes
via density evolution analysis. The capacity (in packets per slot) of the
scheme is then analyzed in the context of the collision channel without
feedback. Moreover, a capacity bound is developed and component code
distributions tightly approaching the bound are derived.Comment: The final version to appear in IEEE Trans. Inf. Theory. 18 pages, 10
figure
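For the special case in which the component codes are length-l repetition codes, i.e., irregular repetition slotted ALOHA (IRSA), the density evolution recursion has a simple closed form over the collision channel. A sketch assuming a user degree distribution {l: Λ_l} and load G in packets per slot; the analysis for general (n,k) component codes in the paper is more involved:

```python
import math

def irsa_packet_loss(load, degree_dist, iters=1000):
    """Density evolution for irregular repetition slotted ALOHA (IRSA),
    the repetition-code special case of coded slotted ALOHA, over the
    collision channel without feedback.

    degree_dist: {l: Lambda_l}, probability that a user sends l replicas.
    load: channel load G in packets per slot.
    Returns the asymptotic probability that a packet is never resolved.
    """
    avg_deg = sum(l * p for l, p in degree_dist.items())
    # edge-perspective degree distribution: lambda_l = l * Lambda_l / avg_deg
    edge = {l: l * p / avg_deg for l, p in degree_dist.items()}
    p = 1.0  # prob. an edge is unresolved, seen from the slot side
    q = 1.0  # prob. an edge is unresolved, seen from the user side
    for _ in range(iters):
        # slot update: slot degrees are Poisson with mean G * avg_deg
        p = 1.0 - math.exp(-load * avg_deg * q)
        # user update: a replica stays unresolved iff all other replicas do
        q = sum(el * p ** (l - 1) for l, el in edge.items())
    # a packet is lost iff none of its l replicas is resolved
    return sum(pl * p ** l for l, pl in degree_dist.items())
```

For Λ(x) = x (plain slotted ALOHA) this reduces to the familiar loss 1 - e^{-G}; optimizing the distribution is what pushes the threshold toward the capacity bound.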
Ultra-Sparse Non-Binary LDPC Codes for Probabilistic Amplitude Shaping
This work shows how non-binary low-density parity-check codes over GF(q)
can be combined with probabilistic amplitude shaping (PAS) (B\"ocherer, et al.,
2015), which combines forward-error correction with non-uniform signaling for
power-efficient communication. Ultra-sparse low-density parity-check codes over
GF(64) and GF(256) gain 0.6 dB in power efficiency over state-of-the-art binary
LDPC codes at a spectral efficiency of 1.5 bits per channel use and a
blocklength of 576 bits. The simulation results are compared to finite-length
coding bounds and complemented by density evolution analysis.
Comment: Accepted for Globecom 201
Bit-Metric Decoding of Non-Binary LDPC Codes with Probabilistic Amplitude Shaping
A new approach for combining non-binary low-density parity-check (NB-LDPC)
codes with higher-order modulation and probabilistic amplitude shaping (PAS) is
presented. Instead of symbol-metric decoding (SMD), a bit-metric decoder (BMD)
is used so that matching the field order of the non-binary code to the
constellation size is not needed, which increases the flexibility of the coding
scheme. Information rates, density evolution thresholds and finite-length
simulations show that the flexibility comes at no loss of performance if PAS is
used.
Comment: Accepted for IEEE Communications Letters.
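Bit-metric decoding replaces the symbol-wise metric by per-bit LLRs obtained by marginalizing the symbol metrics over each bit level. A minimal sketch of that marginalization (labeling with bit 0 as the least significant bit is an assumed convention):

```python
import math

def bit_llrs(symbol_metrics, m):
    """Bit-metric decoding front end: marginalize per-symbol metrics
    (likelihood times prior, the prior being non-uniform under
    probabilistic shaping) into one LLR per bit level, instead of
    passing the full symbol metric to the decoder. Symbols are labeled
    by integers in [0, 2^m); bit 0 is the least significant bit."""
    llrs = []
    for i in range(m):
        p0 = sum(p for x, p in symbol_metrics.items() if not (x >> i) & 1)
        p1 = sum(p for x, p in symbol_metrics.items() if (x >> i) & 1)
        llrs.append(math.log(p0 / p1))
    return llrs
```

Because the decoder only ever sees m bit metrics per channel use, the field order of the non-binary code is decoupled from the constellation size, which is the flexibility the abstract refers to.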
Caching at the Edge with LT codes
We study the performance of caching schemes based on LT codes under the
peeling (iterative) decoding algorithm. We assume that users request content
downloads from multiple cache-aided transmitters. The transmitters are connected
through a backhaul link to a master node, while no direct link exists between
users and the master node. Each content item is fragmented and encoded with an
LT code. Cache placement at each transmitter is optimized such that transmission
over the backhaul link is minimized. We derive a closed-form expression for the
backhaul transmission rate. We compare the performance of a caching scheme based
on LT codes with that of a caching scheme based on maximum distance separable
codes. Finally, we show that caching with LT codes performs as well as caching
with maximum distance separable codes.
Finite Length Analysis of Irregular Repetition Slotted ALOHA in the Waterfall Region
A finite-length analysis is introduced for irregular repetition slotted ALOHA
(IRSA) that enables accurate estimation of its performance in the
moderate-to-high packet loss probability regime, i.e., in the so-called
waterfall region. The analysis is tailored to the collision channel model,
which enables mapping the description of the successive interference
cancellation process onto the iterative erasure decoding of low-density
parity-check codes. The analysis provides accurate estimates of the packet loss
probability of IRSA in the waterfall region as demonstrated by Monte Carlo
simulations.
Comment: Accepted for publication in IEEE Communications Letters.
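The packet loss probability that the analysis estimates can also be measured directly by Monte Carlo simulation, representing SIC as iterative peeling of singleton slots on the bipartite graph; a minimal sketch (function and parameter names are illustrative):

```python
import random

def irsa_plr(n_users, n_slots, degrees, weights, trials=200, rng=None):
    """Monte Carlo estimate of the IRSA packet loss rate over the
    collision channel. Each user sends l replicas in distinct slots
    (l drawn from `degrees` with probabilities `weights`); successive
    interference cancellation is simulated as iterative peeling of
    slots that contain exactly one unresolved replica."""
    rng = rng or random.Random(0)
    lost = 0
    for _ in range(trials):
        slot_users = [set() for _ in range(n_slots)]
        user_slots = []
        for u in range(n_users):
            l = rng.choices(degrees, weights)[0]
            slots = rng.sample(range(n_slots), l)
            user_slots.append(slots)
            for s in slots:
                slot_users[s].add(u)
        resolved = set()
        progress = True
        while progress:
            progress = False
            for s in range(n_slots):
                if len(slot_users[s]) == 1:
                    (u,) = slot_users[s]
                    resolved.add(u)           # singleton slot: decode user u
                    for t in user_slots[u]:   # cancel all replicas of u
                        slot_users[t].discard(u)
                    progress = True
        lost += n_users - len(resolved)
    return lost / (trials * n_users)
```

At moderate-to-high loss rates (the waterfall region targeted by the paper's analysis) such simulations converge quickly, which is what makes the analytical estimates easy to validate.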
Inactivation Decoding of LT and Raptor Codes: Analysis and Code Design
In this paper we analyze LT and Raptor codes under inactivation decoding. A
first order analysis is introduced, which provides the expected number of
inactivations for an LT code, as a function of the output distribution, the
number of input symbols and the decoding overhead. The analysis is then
extended to the calculation of the distribution of the number of inactivations.
In both cases, random inactivation is assumed. The developed analytical tools
are then exploited to design LT and Raptor codes, enabling a tight control on
the decoding complexity vs. failure probability trade-off. The accuracy of the
approach is confirmed by numerical simulations.
Comment: Accepted for publication in IEEE Transactions on Communications.
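The quantity analyzed here, the number of inactivations under random inactivation, can be counted by simulating the triangulation phase: peel output symbols of reduced degree one, and inactivate a randomly chosen unresolved input symbol whenever peeling stalls. A minimal sketch (the final Gaussian elimination on the inactivated symbols is omitted):

```python
import random

def count_inactivations(n_input, received, rng=None):
    """Count inactivations during the triangulation phase of inactivation
    decoding of an LT code. `received` holds, for each received output
    symbol, the set of input-symbol indices it is the XOR of. Peeling
    resolves an input symbol covered by a reduced-degree-1 output symbol;
    when no such output symbol exists, a random unresolved input symbol
    is inactivated (random inactivation, as assumed in the analysis)."""
    rng = rng or random.Random(0)
    rows = [set(r) for r in received]
    active = set(range(n_input))           # input symbols not yet resolved
    inactivations = 0
    while active:
        deg1 = next((r for r in rows if len(r & active) == 1), None)
        if deg1 is not None:
            sym = next(iter(deg1 & active))    # resolved by peeling
        else:
            sym = rng.choice(sorted(active))   # peeling stalled: inactivate
            inactivations += 1
        active.discard(sym)
    return inactivations
```

Each inactivation adds a column to the dense system solved at the end, so the expected count computed by the analysis directly measures the decoding complexity being traded against failure probability.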